# Vocabulary compression

## Bloom 1b4 Zh

- Publisher: Langboat
- License: OpenRAIL
- Tags: Large Language Model, Transformers, Chinese
- Description: A Chinese language model based on the bigscience/bloom-1b7 architecture with 1.4B parameters, using vocabulary compression to reduce GPU memory usage.
- Stats: 5,157 · 18
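
The parameter reduction here comes from the word-embedding matrix: BLOOM's multilingual vocabulary of roughly 250k tokens is cut down to a smaller, presumably Chinese-focused subset, shrinking the `vocab_size × hidden_size` embedding table. A minimal sketch comparing the two configurations, assuming only that both configs are reachable on the Hugging Face Hub (just the small config files are downloaded):

```python
from transformers import AutoConfig

# Compare the original and vocabulary-compressed configurations.
# Only config.json is fetched for each model, so this is cheap to run.
for name in ["bigscience/bloom-1b7", "Langboat/bloom-1b4-zh"]:
    cfg = AutoConfig.from_pretrained(name)
    embed_params = cfg.vocab_size * cfg.hidden_size  # embedding rows x hidden width
    print(
        f"{name}: vocab_size={cfg.vocab_size}, "
        f"embedding parameters ≈ {embed_params / 1e6:.0f}M"
    )
```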

## Rut5 Base

- Publisher: cointegrated
- License: MIT
- Tags: Large Language Model, Supports Multiple Languages
- Description: A streamlined version of google/mt5-base, optimized for Russian and English, with 58% fewer parameters.
- Stats: 27.85k · 11
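
The same trick applies here: most of mt5-base's parameters sit in its ~250k-row input embedding and untied output projection, so restricting the vocabulary to tokens that actually occur in Russian and English text removes the bulk of them. Below is a rough sketch of the trimming step, not cointegrated's exact recipe; `keep_ids` is a placeholder for the token ids one would collect by tokenizing a Russian/English corpus, and the matching sentencepiece tokenizer surgery is omitted:

```python
import torch
from transformers import MT5ForConditionalGeneration

# Load the full multilingual model (downloads the checkpoint, ~2 GB).
model = MT5ForConditionalGeneration.from_pretrained("google/mt5-base")

keep_ids = torch.arange(30_000)  # placeholder: ids of the tokens to keep
d_model = model.config.d_model

# Slice the kept rows out of the shared input embedding.
old_embed = model.get_input_embeddings().weight.data
new_embed = torch.nn.Embedding(len(keep_ids), d_model)
new_embed.weight.data = old_embed[keep_ids].clone()
model.set_input_embeddings(new_embed)

# mT5 does not tie its output projection to the embedding, so slice lm_head too.
old_head = model.lm_head.weight.data
new_head = torch.nn.Linear(d_model, len(keep_ids), bias=False)
new_head.weight.data = old_head[keep_ids].clone()
model.lm_head = new_head

model.config.vocab_size = len(keep_ids)
total = sum(p.numel() for p in model.parameters())
print(f"parameters after trimming: {total / 1e6:.0f}M")
```

In practice the kept tokens also need to be remapped to a new, dense id space and the tokenizer rebuilt to match; the slicing above only illustrates where most of the reduction comes from.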